Boolean functions: noise stability, non-interactive correlation, and mutual information
Authors
Abstract
Let ε ∈ [0, 1/2] be the noise parameter and p > 1. We study the isoperimetric problem of determining, for a fixed mean Ef, which Boolean function f maximizes the p-th moment E(T_ε f)^p of the noise operator T_ε acting on Boolean functions f : {0,1}^n → {0,1}. Our findings are: in the low-noise scenario, i.e., when ε is small, the maximum is achieved by the lexicographical function; in the high-noise scenario, i.e., when ε is close to 1/2, the maximum is achieved by Boolean functions with maximal degree-1 Fourier weight; and when p is a large integer, the maximum is achieved by some monotone function, which in particular implies that, among balanced Boolean functions, the maximum is achieved by any function that is 0 on all strings with fewer than n/2 1's. Our results recover Mossel and O'Donnell's results on the problem of non-interactive correlation distillation, and confirm Courtade and Kumar's Conjecture on the most informative Boolean function in the low-noise and high-noise regimes. We also observe that Courtade and Kumar's Conjecture is equivalent to the statement that the dictator function maximizes E(T_ε f)^p for p close to 1.
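For small n, the quantity E(T_ε f)^p can be computed by brute force directly from the definition. The sketch below is our own illustration, not code from the paper; it assumes the standard definition (T_ε f)(x) = E[f(y)], where y is obtained from x by flipping each bit independently with probability ε, and it uses the mean-1/2 lexicographical (dictator-like) function as an example.

```python
import numpy as np

def noise_operator(f_vals, eps, n):
    """Apply the noise operator T_eps to a Boolean function given as a truth
    table f_vals of length 2**n (index = input string read as an integer).
    (T_eps f)(x) = E[f(y)], where y flips each bit of x independently
    with probability eps."""
    Tf = np.zeros(2 ** n)
    for x in range(2 ** n):
        for y in range(2 ** n):
            d = bin(x ^ y).count("1")  # Hamming distance between x and y
            Tf[x] += (eps ** d) * ((1 - eps) ** (n - d)) * f_vals[y]
    return Tf

def pth_moment(f_vals, eps, p, n):
    """E[(T_eps f)^p], the expectation taken over a uniform input x."""
    return float(np.mean(noise_operator(f_vals, eps, n) ** p))

# Illustration on n = 4 bits: the mean-1/2 lexicographical function
# (the indicator that the first bit is 1).
n, eps, p = 4, 0.1, 2.0
lex = np.array([1.0 if x >= 2 ** (n - 1) else 0.0 for x in range(2 ** n)])
print(pth_moment(lex, eps, p, n))
```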
Similar works
On canalizing Boolean functions
Boolean networks are an important model of gene regulatory networks in systems and computational biology. Such networks have been widely studied with respect to their stability and error tolerance. It has turned out that canalizing Boolean functions and their subclass, the nested canalizing functions, appear frequently in such networks. These classes have been shown to have a stabilizing effect...
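As a concrete reference point for the class mentioned here: a Boolean function is canalizing if some variable has a value that, once fixed, forces the output to a constant. The brute-force check below is a hypothetical illustration under that standard definition, not code from the paper.

```python
from itertools import product

def is_canalizing(f, n):
    """True if the n-variable Boolean function f (a callable on 0/1 tuples)
    has a variable i and a value a such that x_i = a forces the output."""
    for i in range(n):
        for a in (0, 1):
            outputs = {f(x) for x in product((0, 1), repeat=n) if x[i] == a}
            if len(outputs) == 1:  # the output is constant once x_i = a
                return True
    return False

# OR is canalizing (setting any input to 1 forces the output to 1);
# parity is not canalizing.
print(is_canalizing(lambda x: max(x), 3))      # True
print(is_canalizing(lambda x: sum(x) % 2, 3))  # False
```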
Comments on "Canalizing Boolean Functions Maximize Mutual Information"
In their recent paper “Canalizing Boolean Functions Maximize Mutual Information,” Klotz et al. argued that canalizing Boolean functions maximize certain mutual informations by an argument involving Fourier analysis on the hypercube. This note supplies short new proofs of their results based on a coupling argument and also clarifies a point on the necessity of considering randomized functions.
Canalizing Boolean Functions Maximize the Mutual Information
The information-processing ability of biologically motivated Boolean networks is of interest in recent information-theoretic research. One measure to quantify this ability is the well-known mutual information. Using Fourier analysis, we show that canalizing functions maximize the mutual information between an input variable and the outcome of the function. We prove our result for Boolean func...
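The quantity in question is the mutual information I(X_i; f(X)) between a single input variable and the function's output under uniformly random inputs. A minimal brute-force computation of it, written for illustration only (the function names are ours), might look like this:

```python
from itertools import product
from math import log2

def mi_input_output(f, n, i):
    """I(X_i ; f(X)) in bits for a Boolean function f on n uniform input bits.
    f is a callable on 0/1 tuples; i indexes the input variable."""
    joint, pa, pb = {}, {}, {}
    for x in product((0, 1), repeat=n):
        xi, y = x[i], f(x)
        w = 1 / 2 ** n                      # uniform input distribution
        joint[(xi, y)] = joint.get((xi, y), 0) + w
        pa[xi] = pa.get(xi, 0) + w
        pb[y] = pb.get(y, 0) + w
    return sum(p * log2(p / (pa[xi] * pb[y])) for (xi, y), p in joint.items())

# OR (a canalizing function) shares information with each input; parity does not.
print(mi_input_output(lambda x: max(x), 3, 0))      # about 0.14 bits
print(mi_input_output(lambda x: sum(x) % 2, 3, 0))  # 0.0
```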
Mutual Information Functions Versus Correlation Functions
This paper studies one application of mutual information to symbolic sequences: the mutual information function M(d). This function is compared with the more frequently used correlation function Γ(d). An exact relation between M(d) and Γ(d) is derived for binary sequences. For sequences with more than two symbols, no such general relation exists; in particular, Γ(d) = 0 may or may not lead to M(d) ...
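For a concrete sense of the two quantities being compared, the sketch below estimates both from the empirical pair frequencies of a binary sequence. It is our own illustration; the paper's exact normalization and notation may differ.

```python
from math import log2

def mi_function(seq, d):
    """M(d): mutual information (in bits) between symbols at distance d,
    estimated from empirical pair frequencies of the sequence."""
    pairs = list(zip(seq, seq[d:]))
    n = len(pairs)
    joint, pa, pb = {}, {}, {}
    for a, b in pairs:
        joint[(a, b)] = joint.get((a, b), 0) + 1 / n
        pa[a] = pa.get(a, 0) + 1 / n
        pb[b] = pb.get(b, 0) + 1 / n
    return sum(p * log2(p / (pa[a] * pb[b])) for (a, b), p in joint.items())

def corr_function(seq, d):
    """Autocorrelation at distance d for a numeric (e.g. 0/1) sequence:
    mean of products minus product of means."""
    x, y = seq[:len(seq) - d], seq[d:]
    n = len(x)
    return sum(a * b for a, b in zip(x, y)) / n - (sum(x) / n) * (sum(y) / n)

# A strictly alternating binary sequence: symbols at distance 1 determine
# each other, so M(1) is close to 1 bit while the correlation is negative.
seq = [0, 1] * 50
print(mi_function(seq, 1), corr_function(seq, 1))
```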
Gate-level Synthesis of Boolean Functions using Information Theory Concepts
In this paper we apply information theory concepts to evolutionary Boolean circuit synthesis. We discuss the schema destruction problem when simple conditional entropy is used as the fitness function. The design problem is the synthesis of Boolean functions using the minimum number of binary multiplexers. We show that the fitness landscape of normalized mutual information exhibits better charact...
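As an illustration of using normalized mutual information as a fitness measure, the sketch below scores a candidate circuit's truth table against the target table; it is our own sketch, and the normalization 2·I/(H_c + H_t) is one common choice that may differ from the one used in the paper.

```python
from math import log2

def entropy(bits):
    """Shannon entropy (bits) of a 0/1 output column."""
    p = sum(bits) / len(bits)
    return 0.0 if p in (0.0, 1.0) else -(p * log2(p) + (1 - p) * log2(1 - p))

def mutual_info(a, b):
    """Mutual information between two 0/1 output columns of equal length."""
    n = len(a)
    joint, pa, pb = {}, {}, {}
    for xy in zip(a, b):
        joint[xy] = joint.get(xy, 0) + 1 / n
        pa[xy[0]] = pa.get(xy[0], 0) + 1 / n
        pb[xy[1]] = pb.get(xy[1], 0) + 1 / n
    return sum(p * log2(p / (pa[x] * pb[y])) for (x, y), p in joint.items())

def nmi_fitness(candidate, target):
    """Normalized mutual information between a candidate circuit's truth
    table and the target truth table (normalization 2*I / (H_c + H_t))."""
    hc, ht = entropy(candidate), entropy(target)
    return 0.0 if hc + ht == 0 else 2 * mutual_info(candidate, target) / (hc + ht)

# Truth tables over all four inputs of a 2-input target (AND):
target = [0, 0, 0, 1]
print(nmi_fitness([0, 0, 0, 1], target))  # 1.0 for an exact match
print(nmi_fitness([0, 0, 1, 1], target))  # strictly less than 1 for an imperfect candidate
```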
Journal: CoRR
Volume: abs/1801.04462
Issue: -
Pages: -
Publication year: 2018